The OCS-SVM: An Objective-Cost-Sensitive SVM With Sample-Based Misclassification Cost Invariance
Authors
Abstract
Similar Resources
Cost-Sensitive Learning of SVM for Ranking
In this paper, we propose a new method for learning to rank. ‘Ranking SVM’ is a method for performing this task. It formalizes the problem as binary classification on instance pairs and performs the classification by means of Support Vector Machines (SVM). In Ranking SVM, the losses for incorrect classifications of instance pairs between different rank pairs are defined as the same. We n...
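As a rough illustration of the pairwise reduction that Ranking SVM is built on (a generic sketch, not the cost-sensitive variant proposed above; the toy features, relevance labels, and LinearSVC learner are assumptions), preference pairs can be turned into signed difference vectors for an ordinary binary SVM:

    # A generic sketch of the pairwise reduction behind Ranking SVM: each pair of
    # items with different relevance becomes one signed difference vector that a
    # binary SVM classifies.
    import numpy as np
    from itertools import combinations
    from sklearn.svm import LinearSVC

    def to_pairs(X, relevance):
        """Build difference vectors x_i - x_j labeled +1 if item i outranks item j."""
        diffs, labels = [], []
        for i, j in combinations(range(len(X)), 2):
            if relevance[i] == relevance[j]:
                continue                      # equal ranks give no preference pair
            diffs.append(X[i] - X[j])
            labels.append(1 if relevance[i] > relevance[j] else -1)
        return np.array(diffs), np.array(labels)

    # Assumed toy data: 6 items with 4 features and graded relevance labels.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(6, 4))
    relevance = np.array([2, 1, 0, 2, 0, 1])

    X_pairs, y_pairs = to_pairs(X, relevance)
    ranker = LinearSVC(C=1.0).fit(X_pairs, y_pairs)
    scores = X @ ranker.coef_.ravel()         # higher score = ranked higher
    print(np.argsort(-scores))

The cost-sensitive variant described in the abstract would additionally weight pair losses by the ranks involved, rather than treating all pair misclassifications equally.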
The 2ν-SVM: A Cost-Sensitive Extension of the ν-SVM
Standard classification algorithms aim to minimize the probability of making an incorrect classification. In many important applications, however, some kinds of errors are more important than others. In this report we review cost-sensitive extensions of standard support vector machines (SVMs). In particular, we describe cost-sensitive extensions of the C-SVM and the ν-SVM, which we denote the 2...
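As a minimal sketch of the class-dependent cost idea these extensions formalize (an approximation using scikit-learn's class_weight to rescale the penalty C per class, not the 2ν-SVM formulation itself; the toy dataset and weights are assumptions):

    # Class-dependent misclassification costs for a soft-margin SVM, approximated
    # with scikit-learn's class_weight, which rescales the penalty C per class.
    from sklearn.datasets import make_classification
    from sklearn.svm import SVC

    # Assumed toy data: an imbalanced binary problem where class-1 errors cost more.
    X, y = make_classification(n_samples=300, weights=[0.9, 0.1], random_state=0)

    # Penalize mistakes on class 1 five times more heavily than on class 0.
    clf = SVC(kernel="rbf", C=1.0, class_weight={0: 1.0, 1: 5.0})
    clf.fit(X, y)
    print("training accuracy:", clf.score(X, y))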
AdaCost: Misclassification Cost-Sensitive Boosting
AdaCost, a variant of AdaBoost, is a misclassification cost-sensitive boosting method. It uses the cost of misclassifications to update the training distribution on successive boosting rounds. The purpose is to reduce the cumulative misclassification cost more than AdaBoost. We formally show that AdaCost reduces the upper bound of cumulative misclassification cost of the training set. Empirical...
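A schematic sketch of cost-driven reweighting in boosting follows; the simplified exponential update, stump learner, and per-example cost vector are assumptions for illustration, and the exact AdaCost cost-adjustment function is not reproduced here:

    # Cost-modulated boosting reweighting, simplified from the AdaCost idea:
    # misclassified examples with higher costs get larger weight increases on the
    # next round.
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier

    def cost_boost(X, y, costs, rounds=10):
        """y in {-1, +1}; costs[i] >= 0 scales how strongly example i is reweighted."""
        n = len(y)
        w = np.full(n, 1.0 / n)
        stumps, alphas = [], []
        for _ in range(rounds):
            stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
            pred = stump.predict(X)
            err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
            alpha = 0.5 * np.log((1 - err) / err)
            # Correct examples shrink as in AdaBoost; costly mistakes grow faster.
            w *= np.where(pred == y, np.exp(-alpha), np.exp(alpha * costs))
            w /= w.sum()
            stumps.append(stump)
            alphas.append(alpha)
        return stumps, alphas

    def boost_predict(stumps, alphas, X):
        return np.sign(sum(a * s.predict(X) for a, s in zip(alphas, stumps)))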
Bi-Parameter Space Partition for Cost-Sensitive SVM
Model selection is an important problem of cost-sensitive SVM (CS-SVM). Although using solution path to find globally optimal parameters is a powerful method for model selection, it is a challenge to extend the framework to solve two regularization parameters of CS-SVM simultaneously. To overcome this challenge, we make three main steps in this paper. (i) A critical-regions-based bi-parameter space...
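For contrast, a naive baseline for the same two-parameter model-selection problem is a plain grid search over the two class-wise penalties; the grids, toy data, and use of class_weight to emulate the two penalties below are illustrative assumptions:

    # Naive baseline for CS-SVM model selection: grid search over the two
    # class-wise penalties (emulated via C and class_weight), i.e. the
    # two-parameter problem that solution-path methods handle more exactly.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import GridSearchCV
    from sklearn.svm import SVC

    X, y = make_classification(n_samples=300, weights=[0.8, 0.2], random_state=1)

    param_grid = {
        "C": [0.1, 1.0, 10.0],                                # base penalty
        "class_weight": [{0: 1.0, 1: r} for r in (1, 2, 5)],  # ratio of the two penalties
    }
    search = GridSearchCV(SVC(kernel="linear"), param_grid, cv=3).fit(X, y)
    print(search.best_params_, search.best_score_)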
Perceptron and SVM learning with generalized cost models
Learning algorithms from the fields of artificial neural networks and machine learning typically do not take any costs into account, or allow only costs that depend on the classes of the examples used for learning. As an extension of class-dependent costs, we consider costs that are example dependent, i.e. feature and class dependent. We derive a cost-sensitive perceptron learning rule for non-sep...
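A minimal sketch of an example-dependent cost-sensitive perceptron update (an assumed form in which each mistake's update is scaled by that example's cost, not the exact rule derived in that paper):

    # Example-dependent cost-sensitive perceptron: the update for a misclassified
    # example is scaled by that example's own misclassification cost.
    import numpy as np

    def cost_perceptron(X, y, costs, epochs=20, lr=1.0):
        """y in {-1, +1}; costs[i] >= 0 is the misclassification cost of example i."""
        w = np.zeros(X.shape[1])
        b = 0.0
        for _ in range(epochs):
            for xi, yi, ci in zip(X, y, costs):
                if yi * (xi @ w + b) <= 0:       # misclassified (or on the boundary)
                    w += lr * ci * yi * xi        # cost-scaled update
                    b += lr * ci * yi
        return w, b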
Journal
Journal title: IEEE Access
Year: 2019
ISSN: 2169-3536
DOI: 10.1109/access.2019.2933437